
    Propfan test assessment testbed aircraft stability and control/performance 1/9-scale wind tunnel tests

    One-ninth scale wind tunnel model tests of the Propfan Test Assessment (PTA) aircraft were performed in three different NASA facilities. Wing and propfan nacelle static pressures, model forces and moments, and the flow field at the propfan plane were measured in these tests. Tests started in June 1985 and were completed in January 1987. These data were needed to assure PTA safety of flight, to predict PTA performance, and to validate the analytical codes that will be used to predict the flow fields in which the propfan will operate.

    Emissions from a HGV Using Used Cooking Oil as a Fuel under Real World Driving Conditions

    To maximize CO2 reduction, refined straight used cooking oil was used as a fuel in Heavy Goods Vehicles (HGVs) in this research. The fuel, called C2G Ultra Biofuel (C2G: Convert to Green Ltd), is a fully renewable diesel replacement made from processed used cooking oil and used directly in diesel engines specifically modified for this purpose. The work is part of a large demonstration project involving ten 44-tonne trucks using C2G Ultra Biofuel to partially replace standard diesel fuel. A dual fuel tank containing both diesel and C2G Ultra Biofuel and an on-board fuel blending system (the Bioltec system) were installed on each vehicle; the blending system heats the C2G Ultra Biofuel and automatically determines the required blending ratio of diesel and C2G Ultra Biofuel according to fuel temperature and engine load. The engine was started on diesel and then switched to C2G Ultra Biofuel under appropriate conditions. Exhaust emissions were measured on one of the trucks under real-world driving conditions using a Portable Emission Measurement System (PEMS), and emissions in neat-diesel mode were compared with those in blended-fuel mode. The results show that C2G Ultra Biofuel can significantly reduce particulate matter (PM) and CO emissions compared with pure diesel.
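
    As an illustration of the kind of comparison reported above, the sketch below computes distance-specific emission factors from 1 Hz PEMS traces and the percentage reduction between the two fuelling modes. The function names and all numbers are hypothetical placeholders, not data or code from the study.

```python
# Hedged sketch: distance-specific emission factors from 1 Hz PEMS traces.
# All values below are synthetic placeholders, not measurements.
import numpy as np

def emission_factor(mass_flow_g_per_s, speed_m_per_s, dt=1.0):
    """Integrate pollutant mass flow over the trip and divide by distance (g/km)."""
    total_mass_g = np.sum(mass_flow_g_per_s) * dt
    distance_km = np.sum(speed_m_per_s) * dt / 1000.0
    return total_mass_g / distance_km

# Hypothetical 1 Hz traces for the same route in the two fuelling modes
rng = np.random.default_rng(0)
pm_diesel, v_diesel = rng.random(3600) * 1e-4, np.full(3600, 20.0)
pm_blend, v_blend = rng.random(3600) * 5e-5, np.full(3600, 20.0)

ef_diesel = emission_factor(pm_diesel, v_diesel)
ef_blend = emission_factor(pm_blend, v_blend)
print(f"PM reduction: {100 * (ef_diesel - ef_blend) / ef_diesel:.1f} %")
```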

    Systematic reduction of complex tropospheric chemical mechanisms, Part I: sensitivity and time-scale analyses

    Explicit mechanisms describing the complex degradation pathways of atmospheric volatile organic compounds (VOCs) are important, since they allow the contribution of individual VOCs to secondary pollutant formation to be studied. They are, however, computationally expensive to solve, since they contain large numbers of species and a wide range of time-scales, causing stiffness in the resulting equation systems. This paper and the following companion paper describe the application of systematic and automated methods for reducing such complex mechanisms whilst maintaining the accuracy of the model with respect to important species and features. The methods are demonstrated via application to version 2 of the Leeds Master Chemical Mechanism. The methods of Jacobian analysis and overall rate sensitivity analysis proved to be efficient and capable of removing the majority of redundant reactions and species in the scheme across a wide range of conditions relevant to the polluted troposphere. The application of principal component analysis of the rate sensitivity matrix was computationally expensive, owing to its decomposition of very large matrices, and did not produce significant reduction over and above the other sensitivity methods. The use of the quasi-steady state approximation (QSSA) proved to be an extremely successful method of removing the fast time-scales within the system, as demonstrated by a local perturbation analysis at each stage of reduction. QSSA species were selected automatically via the calculation of instantaneous QSSA errors based on user-selected tolerances. The application of the QSSA led to the removal of a large number of alkoxy radicals and excited Criegee bi-radicals via reaction lumping. The resulting reduced mechanism was shown to reproduce the concentration profiles of the important species selected from the full mechanism over a wide range of conditions, including those outside of which the reduced mechanism was generated. As a result of a factor-of-2 reduction in the number of species in the scheme, and a reduction in stiffness, the computational time required for simulations was reduced by a factor of 4 compared with the full scheme.
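
    The instantaneous QSSA error test described above can be sketched as follows. This is an illustrative formulation, assuming the common definition err_i = |f_i(c)| / |J_ii| (net production rate divided by the inverse chemical lifetime from the Jacobian diagonal); the function names and tolerances are hypothetical, not taken from the paper.

```python
# Hedged sketch: flag QSSA candidate species by an instantaneous-error test.
# Assumes err_i = |f_i(c)| / |J_ii|; names and tolerances are illustrative.
import numpy as np

def qssa_candidates(f, jac, c, abs_tol=1.0e4, rel_tol=1.0e-3):
    """Return indices of species whose instantaneous QSSA error is below
    an absolute tolerance or a relative tolerance times the concentration."""
    rates = f(c)                        # net production rates dc_i/dt
    J = jac(c)                          # Jacobian of the rate function
    inv_lifetimes = np.abs(np.diag(J))  # inverse chemical lifetimes
    err = np.abs(rates) / np.maximum(inv_lifetimes, 1e-30)
    return np.where((err < abs_tol) | (err < rel_tol * np.abs(c)))[0]

# Toy two-species example: a fast intermediate B in A -> B -> C
k1, k2 = 1.0, 1.0e4
f = lambda c: np.array([-k1 * c[0], k1 * c[0] - k2 * c[1]])
jac = lambda c: np.array([[-k1, 0.0], [k1, -k2]])
print(qssa_candidates(f, jac, np.array([1.0e10, 1.0e6])))  # flags species B
```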

    Systematic reduction of complex tropospheric chemical mechanisms, Part II: Lumping using a time-scale based approach

    This paper presents a formal method of species lumping that can be applied automatically to intermediate compounds within detailed and complex tropospheric chemical reaction schemes. The method is based on grouping species with reference to their chemical lifetimes and reactivity structures. A method for determining the forward and reverse transformations between individual and lumped compounds is developed. Preliminary application to the Leeds Master Chemical Mechanism (MCMv2.0) has led to the removal of 734 species and 1777 reactions from the scheme, with minimal degradation of accuracy across a wide range of test trajectories relevant to polluted tropospheric conditions. The lumped groups are seen to relate to groups of peroxy acyl nitrates, nitrates, carbonates, oxepins, substituted phenols, oxyacids and peracids with similar lifetimes and similar reaction rates with OH. In combination with other reduction techniques, such as sensitivity analysis and the application of the quasi-steady state approximation (QSSA), a reduced mechanism has been developed that contains 35% of the number of species and 40% of the number of reactions of the full mechanism. This has led to a factor-of-8 speed-up in computational time in box model simulations.
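
    The forward and reverse transformations described above can be pictured with the small sketch below. The grouping, species names and weights are hypothetical; the real method derives the groups from chemical lifetimes and OH reactivity, which is not reproduced here.

```python
# Hedged sketch: forward (summing) and reverse (weighted de-lumping)
# transformations for a lumped group. Names and weights are illustrative.

groups = {"LUMPED_PAN": ["PAN_A", "PAN_B", "PAN_C"]}  # hypothetical members

def forward(conc, groups):
    """Lumped concentration is the sum of the member concentrations."""
    return {g: sum(conc[s] for s in members) for g, members in groups.items()}

def reverse(lumped, weights, groups):
    """De-lump using fixed weight fractions, e.g. the members' relative
    abundances under reference conditions."""
    return {s: lumped[g] * weights[s]
            for g, members in groups.items() for s in members}

conc = {"PAN_A": 1.0e9, "PAN_B": 5.0e8, "PAN_C": 2.5e8}
weights = {s: conc[s] / sum(conc.values()) for s in conc}
print(reverse(forward(conc, groups), weights, groups))  # recovers conc
```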

    Systematic reduction of complex tropospheric chemical mechanisms using sensitivity and time-scale analyses

    Explicit mechanisms describing the complex degradation pathways of atmospheric volatile organic compounds (VOCs) are important, since they allow the contribution of individual VOCs to secondary pollutant formation to be studied. They are, however, computationally expensive to solve, since they contain large numbers of species and a wide range of time-scales, causing stiffness in the resulting equation systems. This paper and the following companion paper describe the application of systematic and automated methods for reducing such complex mechanisms whilst maintaining the accuracy of the model with respect to important species and features. The methods are demonstrated via application to version 2 of the Leeds Master Chemical Mechanism. The methods of local concentration sensitivity analysis and overall rate sensitivity analysis proved to be efficient and capable of removing the majority of redundant reactions and species in the scheme across a wide range of conditions relevant to the polluted troposphere. The application of principal component analysis of the rate sensitivity matrix was computationally expensive, owing to its decomposition of very large matrices, and did not produce significant reduction over and above the other sensitivity methods. The use of the quasi-steady state approximation (QSSA) proved to be an extremely successful method of removing the fast time-scales within the system, as demonstrated by a local perturbation analysis at each stage of reduction. QSSA species were selected automatically via the calculation of instantaneous QSSA errors based on user-selected tolerances. The application of the QSSA led to the removal of a large number of alkoxy radicals and excited Criegee bi-radicals via reaction lumping. The resulting reduced mechanism was shown to reproduce the concentration profiles of the important species selected from the full mechanism over a wide range of conditions, including those outside of which the reduced mechanism was generated. As a result of a factor-of-2 reduction in the number of species in the scheme, and a reduction in stiffness, the computational time required for simulations was reduced by a factor of 4 compared with the full scheme.
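
    The overall rate sensitivity screening mentioned above can be illustrated as in the sketch below, which ranks reactions by a summed, normalised sensitivity measure and flags low-ranked ones as removal candidates. The B_j formulation and the threshold are common choices from the mechanism-reduction literature and are assumptions here, not the paper's exact procedure.

```python
# Hedged sketch: rank reactions by an overall rate sensitivity measure
# B_j = sum_i ((k_j / f_i) * d f_i / d k_j)^2 over important species, and
# flag low-ranked reactions as removal candidates. Illustrative only.
import numpy as np

def overall_rate_sensitivity(dfdk, k, f, important):
    """dfdk: (n_species, n_reactions) array of d f_i / d k_j;
    k: rate coefficients; f: net production rates;
    important: indices of species whose accuracy must be preserved."""
    norm = dfdk[important, :] * k[None, :] / f[important, None]
    return np.sum(norm ** 2, axis=0)  # one B_j per reaction

def redundant_reactions(B, rel_threshold=1e-6):
    """Reactions whose measure is tiny relative to the largest one."""
    return np.where(B < rel_threshold * B.max())[0]
```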

    Current practice of preparing morphine infusions for nurse/patient-controlled analgesia in a UK paediatric hospital: healthcare professionals' views and experiences

    Objective: To explore the views and experiences of healthcare professionals (HCPs) regarding the preparation of morphine infusions for nurse/patient-controlled analgesia (N/PCA). Methods: Three focus groups were conducted with HCPs (anaesthetists, and nurses in theatres and on wards) at one UK children's hospital. Focus groups were transcribed verbatim and content analysis was used to identify themes. Results: A variety of approaches are used to prepare morphine infusions. A lack of appreciation of the excess volume present in morphine ampoules that nominally contain 1 or 2 mL was identified. Other sources of error were miscalculation, the complexity of the multistep procedure, distractions and time pressure. Participants suggested that ‘ready-to-use’ prefilled syringes and preprogrammed syringe pumps would improve practice and minimise the risk of error. Conclusions: Risks associated with the preparation of infusions for paediatric N/PCA, in particular non-appreciation of the overage (excess volume) in morphine ampoules, raise concerns about the accuracy of current practices.

    A New Simulation Metric to Determine Safe Environments and Controllers for Systems with Unknown Dynamics

    We consider the problem of extracting safe environments and controllers for reach-avoid objectives for systems with known state and control spaces but unknown dynamics. In a given environment, a common approach is to synthesize a controller from an abstraction or a model of the system (potentially learned from data). However, in many situations, the relationship between the dynamics of the model and the actual system is not known, and hence it is difficult to provide safety guarantees for the system. In such cases, the Standard Simulation Metric (SSM), defined as the worst-case norm distance between the model and the system output trajectories, can be used to modify a reach-avoid specification for the system into a more stringent specification for the abstraction. Nevertheless, the obtained distance, and hence the modified specification, can be quite conservative. This limits the set of environments for which a safe controller can be obtained. We propose SPEC, a specification-centric simulation metric, which overcomes these limitations by computing the distance using only the trajectories that violate the specification for the system. We show that modifying a reach-avoid specification with SPEC allows us to synthesize a safe controller for a larger set of environments compared to SSM. We also propose a probabilistic method to compute SPEC for a general class of systems. Case studies using simulators for quadrotors and autonomous cars illustrate the advantages of the proposed metric for determining safe environment sets and controllers. Comment: 22nd ACM International Conference on Hybrid Systems: Computation and Control (2019).
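
    The contrast between the two metrics can be sketched as below: SSM takes the worst-case distance over all sampled trajectory pairs, while SPEC restricts the maximum to pairs on which the system violates the specification. The trajectory data and the violates_spec predicate are hypothetical placeholders, not the paper's probabilistic computation.

```python
# Hedged sketch: worst-case metric (SSM) vs. specification-centric metric
# (SPEC) estimated from sampled model/system trajectory pairs.
import numpy as np

def ssm(model_trajs, system_trajs):
    """Worst-case sup-norm distance over all sampled trajectory pairs."""
    return max(np.max(np.abs(m - s)) for m, s in zip(model_trajs, system_trajs))

def spec(model_trajs, system_trajs, violates_spec):
    """Distance over only those pairs where the system violates the spec."""
    dists = [np.max(np.abs(m - s))
             for m, s in zip(model_trajs, system_trajs) if violates_spec(s)]
    return max(dists) if dists else 0.0

# Hypothetical 1-D trajectories and a placeholder avoid-region predicate
rng = np.random.default_rng(1)
model = [np.cumsum(rng.normal(size=100)) for _ in range(50)]
system = [m + rng.normal(scale=0.1, size=100) for m in model]
violates = lambda traj: np.any(traj > 5.0)
print(ssm(model, system), spec(model, system, violates))
```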

    Systematic lumping of complex tropospheric chemical mechanisms using a time-scale based approach

    This paper presents a formal method of species lumping that can be applied automatically to intermediate compounds within detailed and complex tropospheric chemical reaction schemes. The method is based on grouping species with reference to their chemical lifetimes and reactivity structures. A method for determining the forward and reverse transformations between individual and lumped compounds is developed. Preliminary application to the Leeds Master Chemical Mechanism (MCMv2.0) has led to the removal of 734 species and 1777 reactions from the scheme, with minimal degradation of accuracy across a wide range of test trajectories relevant to polluted tropospheric conditions. The lumped groups are seen to relate to groups of peroxy acyl nitrates, nitrates, carbonates, oxepins, substituted phenols, oxyacids and peracids with similar lifetimes and similar reaction rates with OH. In combination with other reduction techniques, such as sensitivity analysis and the application of the quasi-steady state approximation (QSSA), a reduced mechanism has been developed that contains 35% of the number of species and 40% of the number of reactions of the full mechanism. This has led to a factor-of-8 speed-up in computational time in box model simulations.

    Ground state magnetic dipole moment of 35K

    The ground-state magnetic moment of 35K has been measured using the technique of nuclear magnetic resonance on beta-emitting nuclei. The short-lived 35K nuclei were produced following the reaction of a 36Ar primary beam of energy 150 MeV/nucleon incident on a Be target. The spin polarization of the 35K nuclei produced at 2 degrees relative to the normal primary beam axis was confirmed. Together with the mirror nucleus 35S, the measurement represents the heaviest T = 3/2 mirror pair for which the spin expectation value has been obtained. A linear behavior of gp vs. gn has been demonstrated for the known T = 3/2 mirror moments, and the slope and intercept are consistent with the previous analysis of T = 1/2 mirror pairs. Comment: 14 pages, 5 figures.
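
    The gp vs. gn linearity check mentioned above amounts to a straight-line fit through the known mirror moments, as in the minimal sketch below. The g-factor values are synthetic placeholders, not the measured ones.

```python
# Hedged sketch: least-squares line fit of proton g-factors against neutron
# g-factors for a set of mirror pairs. Values are synthetic placeholders.
import numpy as np

gp = np.array([1.20, 0.85, 1.45, 0.70])    # hypothetical proton g-factors
gn = np.array([-0.35, 0.10, -0.60, 0.25])  # hypothetical neutron g-factors

slope, intercept = np.polyfit(gn, gp, 1)
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
```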